Verification

About
The Verification process provides the evidence that the system or system element performs its intended functions and meets all performance requirements listed in the system performance specification and functional and allocated baselines. Verification answers the question, "Did you build the system correctly?" Verification is a key risk-reduction activity in the implementation and integration of a system and enables the program to catch defects in system elements before integration at the next level, thereby preventing costly troubleshooting and rework.
Role of the PM and SE
The Program Manager (PM) and Systems Engineer, in coordination with the Chief Developmental Tester, manage verification activities and methods as defined in the functional and allocated baselines and review the results of verification. Guidance for managing and coordinating integrated testing activities can be found in the Test and Evaluation Enterprise Guidebook and in DoDI 5000.89, Test and Evaluation.
Activities
Verification begins during Requirements Analysis, when top-level stakeholder performance requirements are decomposed and eventually allocated to system elements in the initial system performance specification and interface control specifications. During this process, the program determines how and when each requirement should be verified and the tasks required to do so, as well as the necessary resources (i.e., test equipment, range time, personnel, etc.). The resulting verification matrix and supporting documentation become part of the functional and allocated baselines.
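The verification matrix described above can be sketched as a simple traceability structure pairing each requirement with its planned verification method, level, and resources. This is an illustrative sketch only, not a prescribed format; the requirement IDs, field names, and resource entries are hypothetical.

```python
from dataclasses import dataclass, field
from enum import Enum

class Method(Enum):
    """The four verification methods."""
    DEMONSTRATION = "demonstration"
    EXAMINATION = "examination"
    ANALYSIS = "analysis"
    TEST = "test"

@dataclass
class VerificationEntry:
    requirement_id: str   # hypothetical ID from the performance specification
    method: Method        # how the requirement will be verified
    level: str            # level of the physical architecture where verification occurs
    resources: list = field(default_factory=list)  # e.g., test equipment, range time
    verified: bool = False

# A minimal verification matrix: one entry per requirement.
matrix = [
    VerificationEntry("SYS-001", Method.TEST, "system", ["range time"]),
    VerificationEntry("SYS-002", Method.ANALYSIS, "subsystem"),
    VerificationEntry("SYS-003", Method.EXAMINATION, "component"),
]

# Requirements still awaiting verification evidence:
open_items = [e.requirement_id for e in matrix if not e.verified]
print(open_items)  # ['SYS-001', 'SYS-002', 'SYS-003']
```

In practice such a matrix lives in a requirements-management tool and is baselined with the specifications, but the underlying structure is the same: every requirement carries a planned method and the resources needed to execute it.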
Verification may be accomplished by any combination of the following methods:
- Demonstration. Demonstration is the performance of operations at the system or system element level where visual observations are the primary means of verification. Demonstration is used when quantitative assurance is not required for the verification of the requirements.
- Examination. Examination is the visual inspection of equipment and the evaluation of drawings and other pertinent design data and processes. It is used to verify conformance with characteristics such as physical and material properties, part and product marking, and workmanship.
- Analysis. Analysis is the use of recognized analytic techniques (including computer models) to interpret or explain the behavior/performance of the system element. Analysis of test data or review and analysis of design data should be used as appropriate to verify requirements.
- Test. Test is an activity designed to provide data on functional features and equipment operation under fully controlled and traceable conditions. The data are subsequently used to evaluate quantitative characteristics.
Designs are verified at all levels of the physical architecture through a cost-effective combination of these methods, all of which can be aided by modeling and simulation.
Verification activities and results are documented among the artifacts for Functional Configuration Audits (FCA) and the System Verification Review (SVR) (see SE Guidebook Section 3.6). When possible, verification should stress the system, or system elements, under realistic conditions representative of its intended use.
The individual system elements provided by the Implementation process are verified through developmental test and evaluation (DT&E), acceptance testing or qualification testing. During the Integration process, the successively higher-level system elements may be verified before they move on to the next level of integration. Verification of the system as a whole occurs when integration is complete. As design changes occur, each change should be assessed for potential impact to the qualified baseline. This may include a need to repeat portions of verification in order to mitigate risk of performance degradation.
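The change-impact assessment described above can be illustrated as a small traceability query: given an allocation of requirements to system elements, a design change to one element flags the affected requirements for repeat verification. A minimal sketch follows; the allocation data and element names are hypothetical.

```python
# Hypothetical allocation of requirements to system elements,
# as recorded in the allocated baseline.
allocation = {
    "SYS-001": ["radio", "antenna"],
    "SYS-002": ["antenna"],
    "SYS-003": ["power supply"],
}

def requirements_to_reverify(changed_element: str) -> list[str]:
    """Return the requirements whose allocated elements include the changed one."""
    return sorted(req for req, elements in allocation.items()
                  if changed_element in elements)

# A change to the antenna affects two requirements:
print(requirements_to_reverify("antenna"))  # ['SYS-001', 'SYS-002']
```

The point of the sketch is the direction of the query: traceability from elements back to requirements is what lets a program scope re-verification to the affected portion of the qualified baseline rather than repeating the full campaign.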
Outputs
The output of the Verification process is a verified production-representative article with documentation to support Initial Operational Test and Evaluation (IOT&E). The SVR provides a determination of the extent to which the system meets the system performance specification.
Products and Tasks
Product | Tasks
---|---
AWQI 7-1-1: Design and implement a testing process to compare actual performance with required system/item performance |
AWQI 7-2-1: Verify system compliance with defined physical architecture |
Source: AWQI eWorkbook
Resources
Key Terms
- Developmental Test Adequacy Evaluation (aka Developmental Test Assessment)
- Developmental Test and Evaluation (DT&E)
- Functional Configuration Audit
- Inspection
- Prototype Models
- System Verification Review
- Test & Evaluation Master Plan (TEMP)
- Test Readiness Review (TRR)
- Testing
- Verification
- Verification, Validation, and Accreditation
Source:
DAU ACQuipedia
DAU Glossary
Policy and Guidance
- SE Guidebook Section 4.2.6. Verification Process
- DoDI 5000.89, Test and Evaluation, Section 5, DT&E
- DoD Cybersecurity Test and Evaluation Guidebook
- DoD Cybersecurity Test and Evaluation Guidebook, Addendum
- IEEE 15288.1-2014 Section 6.4.9 Verification process
- SEBoK, System Verification
- Systems Engineering Plan (SEP) Outline Section 2.1 Requirements Development
- Test and Evaluation Enterprise Guidebook
DAU Training Courses
- TST 102: Fundamentals of Test and Evaluation
- TST 2040V: Test and Evaluation for Practitioners
- LOG 0030: Supportability Test & Evaluation
- TST 1100: Intro to Systems Engineering for Testers
- TST 2100: Applied Systems Engineering for Testers
- CLE 023: Modeling and Simulation in Test and Evaluation
- CLE 084: Models, Simulations, and Digital Engineering
- WSE 006: Engineering Management Workshop (EMW)
- WSE 026: Test & Evaluation Across the Acquisition Life Cycle
- WSS 013: Cyber Training Range – Aviation Systems
DAU Tools
Media
- Developmental and Operational Testing
- Major Programs: Test & Evaluation
- Test and Evaluation playlist
DAU Communities of Practice
- DAU Test and Evaluation Community of Practice